CCG Categories for Distributional Semantic Models

Authors

  • Paramita Mirza
  • Raffaella Bernardi
Abstract

For the last decade, distributional semantics has been an active area of research addressing the problem of understanding the semantics of words in natural language. The core principle of the distributional semantic approach is that the linguistic context surrounding a given word, represented as a vector, provides important information about its meaning. In this paper we investigate the possibility of exploiting Combinatory Categorial Grammar (CCG) categories as syntactic features for characterizing the context vector, and hence the meaning, of words. We find that CCG categories can enhance the representation of verb meaning.
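The core idea of building context vectors from co-occurrence counts, with CCG categories attached to context features, can be sketched as follows. This is a toy illustration only: the corpus, window size, and category labels are invented for the example and are not the paper's actual experimental setup.

```python
from collections import Counter
from math import sqrt

# Toy corpus: each token is a (word, CCG category) pair. In a real pipeline
# the categories would come from a CCG parser; these are hand-assigned.
corpus = [
    [("dogs", "N"), ("chase", "(S\\NP)/NP"), ("cats", "N")],
    [("dogs", "N"), ("pursue", "(S\\NP)/NP"), ("cats", "N")],
    [("dogs", "N"), ("sleep", "S\\NP")],
]

def context_vector(target, sents, window=2):
    """Count (context word, CCG category) features around each occurrence of target."""
    vec = Counter()
    for sent in sents:
        for i, (word, _cat) in enumerate(sent):
            if word != target:
                continue
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vec[sent[j]] += 1  # feature = (word, category), not word alone
    return vec

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(c * v[k] for k, c in u.items())
    norm = lambda x: sqrt(sum(c * c for c in x.values()))
    return dot / (norm(u) * norm(v)) if u and v else 0.0

# "chase" and "pursue" share identical CCG-typed contexts here,
# so their vectors come out maximally similar.
sim = cosine(context_vector("chase", corpus), context_vector("pursue", corpus))
```

Keying features on the (word, category) pair rather than the bare word is what lets the model distinguish, say, a noun use of a context word from a verb use.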


Related papers

Exploiting CCG Derivations within Distributional Semantic Models

For the last decade, Distributional Semantic Models (DSMs) have been an active area of research to address the problem of understanding the semantics of words in natural language. The central assumption which functions as the core of DSMs is that the linguistic context surrounding a given word provides important information about its meaning, and hence word co-occurrence statistics can provide ...


Redundancy in Perceptual and Linguistic Experience: Comparing Feature-Based and Distributional Models of Semantic Representation

Since their inception, distributional models of semantics have been criticized as inadequate cognitive theories of human semantic learning and representation. A principal challenge is that the representations derived by distributional models are purely symbolic and are not grounded in perception and action; this challenge has led many to favor feature-based models of semantic representation. We...


A Type-Driven Tensor-Based Semantics for CCG

This paper shows how the tensor-based semantic framework of Coecke et al. can be seamlessly integrated with Combinatory Categorial Grammar (CCG). The integration follows from the observation that tensors are linear maps, and hence can be manipulated using the combinators of CCG, including type-raising and composition. Given the existence of robust, wide-coverage CCG parsers, this opens up the p...
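The observation that tensors are linear maps can be illustrated with a minimal NumPy sketch of CCG function application as tensor contraction. The dimensionality and tensor contents below are arbitrary placeholders, not trained representations from the framework.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # toy dimensionality of the noun space

# A transitive verb of CCG category (S\NP)/NP is modeled as an order-3 tensor:
# axes are (sentence, subject, object).
verb = rng.standard_normal((d, d, d))
obj = rng.standard_normal(d)
subj = rng.standard_normal(d)

# Forward application (>): contract the object axis, leaving a matrix
# of category S\NP, i.e. a linear map from subject vectors to sentence vectors.
vp = np.tensordot(verb, obj, axes=([2], [0]))  # shape (d, d)

# Backward application (<): contract the subject axis to get the sentence vector.
sentence = vp @ subj  # shape (d,)
```

Because contraction is associative, the two-step derivation agrees with contracting both arguments at once (`np.einsum('ijk,j,k->i', verb, subj, obj)`), which is what makes the tensor view compose cleanly with CCG combinators.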


Vector Space Semantic Parsing: A Framework for Compositional Vector Space Models

We present vector space semantic parsing (VSSP), a framework for learning compositional models of vector space semantics. Our framework uses Combinatory Categorial Grammar (CCG) to define a correspondence between syntactic categories and semantic representations, which are vectors and functions on vectors. The complete correspondence is a direct consequence of minimal assumptions about the sema...


Semantic Coherence Facilitates Distributional Learning

Computational models have shown that purely statistical knowledge about words’ linguistic contexts is sufficient to learn many properties of words, including syntactic and semantic category. For example, models can infer that “postman” and “mailman” are semantically similar because they have quantitatively similar patterns of association with other words (e.g., they both tend to occur with word...



Publication date: 2013